Learning a class of large finite state machines with a recurrent neural network

Authors

  • C. Lee Giles
  • Bill G. Horne
  • Tsungnan Lin
Abstract

One of the issues in any learning model is how it scales with problem size. The problem of learning finite state machines (FSMs) from examples with recurrent neural networks has been extensively explored. However, these results are somewhat disappointing in the sense that the machines that can be learned are too small to be competitive with existing grammatical inference algorithms. We show that a type of recurrent neural network (Narendra & Parthasarathy, 1990, IEEE Trans. Neural Networks, 1, 4-27) which has feedback but no hidden state neurons can learn a special type of FSM called a finite memory machine (FMM) under certain constraints. These machines have a large number of states (simulations are for 256- and 512-state FMMs) but have minimal order, relatively small depth, and little logic when the FMM is implemented as a sequential machine.

Keywords: Recurrent neural network, Finite state machine, Grammatical inference, Automata, Sequential machine, Memory, Temporal sequences, NNIIR, NARX.
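To make the architecture described in the abstract concrete, the following is a minimal Python sketch of a NARX-style unit whose only recurrence comes from tapped delay lines on the input and on its own past output, with no hidden state neurons. The class name, tap lengths, weight initialization, and the sigmoid output are illustrative assumptions, not details taken from the paper; training of the weights is omitted.

    import numpy as np

    class NARXUnit:
        """Sketch of a single-output NARX-style recurrent unit: the output is a
        sigmoid of a weighted sum of the last n_u inputs and the last n_y of its
        own past outputs (no hidden state neurons). Names and sizes are
        illustrative, not from the paper."""

        def __init__(self, n_u=4, n_y=4, seed=None):
            rng = np.random.default_rng(seed)
            self.n_u, self.n_y = n_u, n_y
            # one weight per input tap, per output tap, plus a bias
            self.w = rng.normal(scale=0.1, size=n_u + n_y + 1)

        def run(self, inputs):
            """Feed a binary input string; return the output produced at each step."""
            u_taps = [0.0] * self.n_u   # tapped delay line on the input
            y_taps = [0.0] * self.n_y   # tapped delay line on the unit's own output
            outputs = []
            for u in inputs:
                u_taps = [float(u)] + u_taps[:-1]
                x = np.concatenate([u_taps, y_taps, [1.0]])  # taps plus bias
                y = 1.0 / (1.0 + np.exp(-self.w @ x))        # sigmoid output neuron
                outputs.append(y)
                y_taps = [y] + y_taps[:-1]
            return outputs

    net = NARXUnit(n_u=4, n_y=4, seed=0)
    print(net.run([1, 0, 1, 1, 0]))

Because the unit's response depends only on a finite window of past inputs and outputs, it matches the structure of a finite memory machine, whose next output is determined by the last few inputs and outputs rather than by an arbitrary hidden state.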

Similar Articles

Beyond Mealy Machines: Learning Translators with Recurrent Neural Networks

Recent work has shown that recurrent neural networks can be trained to behave as finite-state automata from samples of input strings and their corresponding outputs. However, most of the work has focused on training simple networks to behave as the simplest class of deterministic machines, Mealy (or Moore) machines. The class of translations that can be performed by these machines are very limite...

Stable Encoding of Finite-State Machines in Discrete-Time Recurrent Neural Nets with Sigmoid Units

There has been a lot of interest in the use of discrete-time recurrent neural nets (DTRNN) to learn finite-state tasks, with interesting results regarding the induction of simple finite-state machines from input-output strings. Parallel work has studied the computational power of DTRNN in connection with finite-state computation. This article describes a simple strategy to devise stable encodin...

Learning Document Image Features With SqueezeNet Convolutional Neural Network

The classification of various document images is considered an important step towards building a modern digital library or office automation system. Convolutional Neural Network (CNN) classifiers trained with backpropagation are considered to be the current state of the art model for this task. However, there are two major drawbacks for these classifiers: the huge computational power demand for...

Asynchronous translations with recurrent neural nets

In recent years, many researchers have explored the relation between discrete-time recurrent neural networks (DTRNN) and finite-state machines (FSMs) either by showing their computational equivalence or by training them to perform as finite-state recognizers from examples. Most of this work has focussed on the simplest class of deterministic state machines, that is deterministic finite automata...

An efficient one-layer recurrent neural network for solving a class of nonsmooth optimization problems

Constrained optimization problems have a wide range of applications in science, economics, and engineering. In this paper, a neural network model is proposed to solve a class of nonsmooth constrained optimization problems with a nonsmooth convex objective function subject to nonlinear inequality and affine equality constraints. It is a one-layer non-penalty recurrent neural network based on the...

Journal:
  • Neural Networks

Volume 8, Issue 

Pages -

Published: 1995